Quantized incremental algorithms for distributed optimization

Authors

Abstract


Similar Articles

Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms

This paper considers a networked system consisting of an operator, which manages the system, and a finite number of subnetworks of users, and studies the problem of minimizing the sum of the operator's and all users' objective functions over the intersection of the operator's and all users' constraint sets. When the users in each subnetwork can communicate with each other, they can implement ...
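As a rough illustration of this incremental, constraint-intersection setting (not the paper's acceleration method), the sketch below cycles subgradient steps over a few users' objectives and projects each iterate onto one user's constraint set. The objectives, box constraints, step size, and the name incremental_projected_subgradient are all hypothetical stand-ins.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto a box constraint set [lo, hi]^n."""
    return np.clip(x, lo, hi)

def incremental_projected_subgradient(grads, boxes, x0, steps=200, alpha=0.05):
    """Cycle through users: subgradient step on f_i, then project onto C_i."""
    x = x0.copy()
    m = len(grads)
    for k in range(steps):
        i = k % m                      # cyclic order over the users
        x = x - alpha * grads[i](x)    # subgradient step for user i's objective
        x = project_box(x, *boxes[i])  # projection onto user i's constraint set
    return x

# Toy instance: f_i(x) = ||x - c_i||^2 with identical box constraints,
# so the intersection of the users' sets is the box itself.
centers = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([0.5, 0.5])]
grads = [lambda x, c=c: 2.0 * (x - c) for c in centers]
boxes = [(-1.0, 1.0)] * 3
print(incremental_projected_subgradient(grads, boxes, np.zeros(2)))
```

On this toy problem the iterates approach the mean of the centers, (0.5, 0.5), which lies inside the common constraint box.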


Incremental Stochastic Subgradient Algorithms for Convex Optimization

This paper studies the effect of stochastic errors on two constrained incremental subgradient algorithms. The incremental subgradient algorithms are viewed as decentralized network optimization algorithms as applied to minimize a sum of functions, when each component function is known only to a particular agent of a distributed network. First, the standard cyclic incremental subgradient algorit...
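A minimal sketch of the cyclic incremental subgradient iteration the abstract describes, with an additive Gaussian term standing in for the stochastic errors; the toy agent objectives, step size, and noise model are assumptions for illustration only, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def cyclic_incremental_subgradient(subgrads, x0, steps=500, alpha=0.02, noise=0.1):
    """Cyclic incremental subgradient with stochastic errors: at each step,
    one agent's noisy subgradient is applied, in a fixed cyclic order."""
    x = x0.copy()
    m = len(subgrads)
    for k in range(steps):
        i = k % m                                   # fixed cyclic agent order
        g = subgrads[i](x)                          # agent i's subgradient
        g = g + noise * rng.standard_normal(x.shape)  # additive stochastic error
        x = x - alpha * g
    return x

# Toy sum of agent objectives: f_i(x) = |x - a_i|, subgradient sign(x - a_i);
# the sum is minimized at the median of the anchors (here 0.0).
anchors = np.array([-1.0, 0.0, 2.0])
subgrads = [lambda x, a=a: np.sign(x - a) for a in anchors]
print(cyclic_incremental_subgradient(subgrads, np.array([5.0])))
```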


Optimal Algorithms for Distributed Optimization

In this paper, we study the optimal convergence rate for distributed convex optimization problems in networks. We model the communication restrictions imposed by the network as a set of affine constraints and provide optimal complexity bounds for four different setups, namely: the function $F(\mathbf{x}) \triangleq \sum_{i=1}^{m} f_i(\mathbf{x})$ is strongly convex and smooth, either strongly convex or smooth...
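The sketch below is not the paper's optimal method; it only illustrates the underlying problem of minimizing $F(\mathbf{x}) = \sum_{i=1}^{m} f_i(\mathbf{x})$ when each $f_i$ is held by one node and communication is restricted to the network, here via plain decentralized gradient descent with an assumed doubly stochastic mixing matrix W.

```python
import numpy as np

def decentralized_gd(grads, W, X0, steps=300, alpha=0.05):
    """Each node i keeps a local copy x_i (one row of X); one round is a
    consensus (averaging) step with neighbors, then a local gradient step."""
    X = X0.copy()                       # rows are the nodes' local copies
    for _ in range(steps):
        X = W @ X                       # communication: mix with neighbors
        G = np.stack([g(x) for g, x in zip(grads, X)])
        X = X - alpha * G               # computation: local gradient step
    return X

# Toy quadratics f_i(x) = ||x - c_i||^2 on a 3-node network; W is a
# hypothetical doubly stochastic mixing matrix for that topology.
C = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
grads = [lambda x, c=c: 2.0 * (x - c) for c in C]
W = np.array([[0.5, 0.25, 0.25],
              [0.25, 0.5, 0.25],
              [0.25, 0.25, 0.5]])
print(decentralized_gd(grads, W, np.zeros((3, 2))))
```

With a constant step size the local copies only reach a neighborhood of the common minimizer (the mean of the c_i, here the origin); the paper's subject is precisely how fast such network-constrained schemes can be made in the best case.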


Distributed Algorithms for Source Localization Using Quantized Sensor Readings

We consider sensor-based distributed source localization applications, where sensors transmit quantized data to a fusion node, which then produces an estimate of the source location. For this application, the goal is to minimize the amount of information that the sensor nodes have to exchange in order to attain a certain source localization accuracy. We propose an iterative quantizer design alg...
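As a hedged illustration of this setting (not the paper's iterative quantizer design), the sketch below has each sensor apply a plain uniform B-bit quantizer to a noisy range reading, and a fusion node estimates the source location by a grid search against the quantized data; the sensor geometry, noise level, and bit budget are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def uniform_quantize(r, r_max, bits):
    """Map readings in [0, r_max] to the midpoints of 2**bits uniform cells."""
    levels = 2 ** bits
    idx = np.clip((r / r_max * levels).astype(int), 0, levels - 1)
    return (idx + 0.5) * r_max / levels

# Four sensors at known positions observe noisy ranges to an unknown source.
sensors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
source = np.array([1.2, 2.7])
ranges = np.linalg.norm(sensors - source, axis=1) + 0.05 * rng.standard_normal(4)
q = uniform_quantize(ranges, r_max=6.0, bits=3)   # 3 bits sent per sensor

# Fusion node: pick the grid point whose ranges best match the quantized data.
xs = np.linspace(0, 4, 81)
grid = np.array([[x, y] for x in xs for y in xs])
cost = ((np.linalg.norm(grid[:, None] - sensors, axis=2) - q) ** 2).sum(axis=1)
print("estimate:", grid[cost.argmin()])
```

The estimate is coarse because the fixed uniform quantizer wastes bits; designing the quantizer to minimize localization error at a given bit budget is what the cited paper addresses.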


Analysis of Algorithms for Distributed Optimization

Gradient descent (GD) and coordinate descent (CD) are two competing families of optimization algorithms used to solve numerous machine learning tasks. The proliferation of large, web-scale datasets has led researchers to investigate minibatch variants of these algorithms in parallel and distributed settings. However, there is a lack of consensus in the community about the relative merits of the...
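To make the contrast concrete, the toy script below runs both update styles on one least-squares objective: minibatch GD steps along the gradient of a sampled row batch, while cyclic CD exactly minimizes along one coordinate at a time. The problem size, batch size, and step size are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Consistent least-squares problem: minimize 0.5 * ||A x - b||^2.
A = rng.standard_normal((200, 10))
b = A @ rng.standard_normal(10)

def minibatch_gd(x, batch=32, alpha=0.01, steps=500):
    for _ in range(steps):
        idx = rng.choice(len(A), batch, replace=False)   # sample a minibatch
        g = A[idx].T @ (A[idx] @ x - b[idx])             # batch gradient
        x = x - alpha * g
    return x

def coordinate_descent(x, steps=500):
    for k in range(steps):
        j = k % A.shape[1]                               # cycle coordinates
        r = A @ x - b
        x[j] -= (A[:, j] @ r) / (A[:, j] @ A[:, j])      # exact 1-D minimizer
    return x

print("GD residual:", np.linalg.norm(A @ minibatch_gd(np.zeros(10)) - b))
print("CD residual:", np.linalg.norm(A @ coordinate_descent(np.zeros(10)) - b))
```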



Journal

Journal title: IEEE Journal on Selected Areas in Communications

Year: 2005

ISSN: 0733-8716

DOI: 10.1109/jsac.2005.843546